
JEEHP : Journal of Educational Evaluation for Health Professions

2 "Brett Vaughan"
Research article
Learning through multiple lenses: analysis of self, peer, near-peer, and faculty assessments of a clinical history-taking task in Australia
Kylie Fitzgerald, Brett Vaughan
J Educ Eval Health Prof. 2018;15:22. Published online September 18, 2018
DOI: https://doi.org/10.3352/jeehp.2018.15.22
23,419 views • 287 downloads • 4 Web of Science citations • 5 Crossref citations
Abstract
Purpose
Peer assessment provides a framework for developing expected skills and receiving feedback appropriate to the learner’s level. Near-peer (NP) assessment may elevate expectations and motivate learning. Feedback from peers and NPs may be a sustainable way to enhance student assessment feedback. This study analysed the relationships among self, peer, NP, and faculty marks for a clinical assessment task, along with students’ attitudes towards marking by each of these groups.
Methods
A cross-sectional study design was used. Year 2 osteopathy students (n = 86) were invited to complete self and peer marking of a clinical history-taking and communication skills assessment task. NPs and faculty also marked the task. Year 2 students additionally completed a questionnaire on their attitudes towards peer and NP marking. Descriptive statistics and the Spearman rho coefficient were used to evaluate the relationships across marker groups.
Results
Year 2 students (n = 9), NPs (n = 3), and faculty (n = 5) were recruited. Correlations between self and peer (r = 0.38) and self and faculty (r = 0.43) marks were moderate. A weak correlation was observed between self and NP marks (r = 0.25). Perceptions of peer and NP marking varied, with over half of the cohort suggesting that peer or NP assessments should not contribute to their grade.
Conclusion
Framing peer and NP assessment as another feedback source may offer a sustainable method for enhancing feedback without overloading faculty resources. Multiple sources of feedback may assist in developing assessment literacy and calibrating students’ self-assessment capability. The small number of students recruited suggests some acceptability of peer and NP assessment; however, further work is required to increase its acceptability.
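The correlation analysis reported above uses Spearman's rho across marker groups. Below is a minimal sketch in Python, assuming each group's marks are held as parallel lists ordered by student; the mark values are purely illustrative and are not the study data.

```python
from scipy.stats import spearmanr

# Hypothetical marks for the nine Year 2 students (illustrative only;
# the study's actual data are not reproduced here).
self_marks    = [62, 71, 58, 80, 66, 74, 69, 55, 77]
peer_marks    = [65, 70, 60, 78, 70, 72, 66, 58, 75]
faculty_marks = [60, 73, 55, 82, 64, 76, 71, 52, 79]

# Spearman's rho: the rank-based correlation coefficient used in the study.
rho_sp, p_sp = spearmanr(self_marks, peer_marks)
rho_sf, p_sf = spearmanr(self_marks, faculty_marks)
print(f"self vs peer:    rho = {rho_sp:.2f} (p = {p_sp:.3f})")
print(f"self vs faculty: rho = {rho_sf:.2f} (p = {p_sf:.3f})")
```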

Citations to this article, as recorded by Crossref:
  • The extent and quality of evidence for osteopathic education: A scoping review
    Andrew MacMillan, Patrick Gauthier, Luciane Alberto, Arabella Gaunt, Rachel Ives, Chris Williams, Jerry Draper-Rodi
    International Journal of Osteopathic Medicine. 2023;49:100663.
  • History and physical exam: a retrospective analysis of a clinical opportunity
    David McLinden, Krista Hailstone, Sue Featherston
    BMC Medical Education. 2023;[Epub]
  • How Accurate Are Our Students? A Meta-analytic Systematic Review on Self-assessment Scoring Accuracy
    Samuel P. León, Ernesto Panadero, Inmaculada García-Martínez
    Educational Psychology Review. 2023;[Epub]
  • Evaluating the Academic Performance of Mustansiriyah Medical College Teaching Staff vs. Final-Year Students Failure Rates
    Wassan Nori, Wisam Akram, Saad Mubarak Rasheed, Nabeeha Najatee Akram, Taqi Mohammed Jwad Taher, Mustafa Ali Kassim Kassim, Alexandru Cosmin Pantazi
    Al-Rafidain Journal of Medical Sciences (ISSN 2789-3219). 2023;5(1S):S151.
  • History-taking level and its influencing factors among nursing undergraduates based on the virtual standardized patient testing results: Cross sectional study
    Jingrong Du, Xiaowen Zhu, Juan Wang, Jing Zheng, Xiaomin Zhang, Ziwen Wang, Kun Li
    Nurse Education Today. 2022;111:105312.
Brief report
Reliability of a viva assessment of clinical reasoning in an Australian pre-professional osteopathy program assessed using generalizability theory  
Brett Vaughan, Paul Orrock, Sandra Grace
J Educ Eval Health Prof. 2017;14:1. Published online January 20, 2017
DOI: https://doi.org/10.3352/jeehp.2017.14.1
46,322 views • 386 downloads • 3 Web of Science citations • 3 Crossref citations
Abstract
Clinical reasoning is situation-dependent and case-specific; therefore, assessments incorporating different patient presentations are warranted. The present study aimed to determine the reliability of a multi-station, case-based viva assessment of clinical reasoning in an Australian pre-registration osteopathy program using generalizability theory. Students (from years 4 and 5) and examiners were recruited from the osteopathy program at Southern Cross University, Lismore, Australia. The study took place on a single day in the student teaching clinic, and examiners were trained before the examination. Students were allocated to 1 of 3 rounds, each consisting of five 10-minute stations in an objective structured clinical examination style. Generalizability analysis was used to explore the reliability of the examination. Fifteen students and 5 faculty members participated in the study. The examination produced a generalizability coefficient of 0.53, and 18 stations would be required to achieve a generalizability coefficient of 0.80. The reliability estimates and the psychometric findings related to the marking rubric and overall scores were acceptable; however, further work on examiner training and on ensuring consistent case difficulty is required to improve the reliability of the examination.
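The projection from an observed generalizability coefficient to the number of stations needed for a target coefficient follows a Spearman-Brown-style decision (D) study. Below is a minimal sketch in Python reproducing the abstract's estimate of 18 stations from G = 0.53 at 5 stations; the function name is ours for illustration, not from the paper.

```python
import math

def stations_needed(g_obs: float, n_obs: int, g_target: float) -> int:
    """D-study projection: stations required to reach g_target, given a
    generalizability coefficient of g_obs observed with n_obs stations."""
    factor = (g_target * (1 - g_obs)) / (g_obs * (1 - g_target))
    return math.ceil(n_obs * factor)

# Values from the abstract: G = 0.53 with 5 stations, target G = 0.80.
print(stations_needed(0.53, 5, 0.80))  # -> 18, matching the reported figure
```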

Citations to this article, as recorded by Crossref:
  • The extent and quality of evidence for osteopathic education: A scoping review
    Andrew MacMillan, Patrick Gauthier, Luciane Alberto, Arabella Gaunt, Rachel Ives, Chris Williams, Jerry Draper-Rodi
    International Journal of Osteopathic Medicine. 2023;49:100663.
  • Acceptability of the 8-case objective structured clinical examination of medical students in Korea using generalizability theory: a reliability study
    Song Yi Park, Sang-Hwa Lee, Min-Jeong Kim, Ki-Hwan Ji, Ji Ho Ryu
    Journal of Educational Evaluation for Health Professions. 2022;19:26.
  • Interesting statistics regarding the papers published in Journal of Educational Evaluation for Health Professions in 2017
    Yera Hur
    Journal of Educational Evaluation for Health Professions. 2017;14:36.
